Variational Gram Functions: Convex Analysis and Optimization
Authors
Abstract
We propose a new class of convex penalty functions, called variational Gram functions (VGFs), that can promote pairwise relations, such as orthogonality, among a set of vectors in a vector space. These functions can serve as regularizers in convex optimization problems arising from hierarchical classification, multitask learning, and estimating vectors with disjoint supports, among other applications. We study convexity for VGFs, and give efficient characterizations for their convex conjugates, subdifferentials, and proximal operators. We discuss efficient optimization algorithms for regularized loss minimization problems where the loss admits a common, yet simple, variational representation and the regularizer is a VGF. These algorithms enjoy a simple kernel trick, an efficient line search, as well as computational advantages over first order methods based on the subdifferential or proximal maps. We also establish a general representer theorem for such learning problems. Lastly, numerical experiments on a hierarchical classification problem are presented to demonstrate the effectiveness of VGFs and the associated optimization algorithms.
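For intuition, a VGF penalizes pairwise inner products among a set of vectors via their Gram matrix. The following is a minimal illustrative sketch of one simple special case, the sum of absolute pairwise inner products, which vanishes exactly when the vectors are mutually orthogonal; the function name and this particular choice of penalty are assumptions for illustration, not the paper's general definition (which takes a maximum of Gram-matrix inner products over a set of weight matrices).

```python
import numpy as np

def vgf_pairwise_abs(X):
    """Illustrative VGF-style penalty (hypothetical helper, not from the paper):
    sum_{i<j} |<x_i, x_j>| over the columns x_1, ..., x_m of X.
    Zero iff the columns are mutually orthogonal."""
    G = X.T @ X                          # Gram matrix: G[i, j] = <x_i, x_j>
    off_diag = G - np.diag(np.diag(G))   # zero out the diagonal <x_i, x_i>
    return 0.5 * np.abs(off_diag).sum()  # each pair counted once

# Orthogonal columns incur zero penalty ...
X_orth = np.array([[1.0, 0.0],
                   [0.0, 1.0]])
print(vgf_pairwise_abs(X_orth))   # 0.0

# ... while correlated columns are penalized.
X_corr = np.array([[1.0, 1.0],
                   [0.0, 1.0]])
print(vgf_pairwise_abs(X_corr))   # |<x_1, x_2>| = 1.0
```

Such a penalty drives the columns toward orthogonality when added to a loss, which is the kind of pairwise structure the abstract describes.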
Similar resources
Vector Optimization Problems and Generalized Vector Variational-Like Inequalities
In this paper, some properties of pseudoinvex functions, defined by means of limiting subdifferential, are discussed. Furthermore, the Minty vector variational-like inequality, the Stampacchia vector variational-like inequality, and the weak formulations of these two inequalities defined by means of limiting subdifferential are studied. Moreover, some relationships between the vector vari...
Sequential Optimality Conditions and Variational Inequalities
In recent years, sequential optimality conditions have frequently been used to establish convergence of iterative methods for nonlinear constrained optimization problems. Sequential optimality conditions do not require any constraint qualifications. In this paper, we present the necessary sequential complementary approximate Karush-Kuhn-Tucker (CAKKT) condition for a point to be a solution of a ...
$(\varphi_1, \varphi_2)$-variational principle
In this paper we prove that if $X$ is a Banach space, then for every lower semicontinuous, bounded-below function $f$, there exists a $\left(\varphi_1, \varphi_2\right)$-convex function $g$, with arbitrarily small norm, such that $f + g$ attains its strong minimum on $X$. This result extends some well-known variational principles, such as that of Ekeland [On the variational principle, J. Ma...
Continuous essential selections and integral functionals
Given a strictly positive measure, we characterize inner semicontinuous, solid, convex-valued mappings for which continuous functions that are selections almost everywhere are selections. This class contains continuous mappings as well as fully lower semicontinuous closed convex-valued mappings that arise in variational analysis and optimization of integral functionals. The characterization allo...
Orthogonal Invariance and Identifiability
Matrix variables are ubiquitous in modern optimization, in part because variational properties of useful matrix functions often expedite standard optimization algorithms. Convexity is one important such property: permutation-invariant convex functions of the eigenvalues of a symmetric matrix are convex, leading to the wide applicability of semidefinite programming algorithms. We prove the analo...
Journal:
- SIAM Journal on Optimization
Volume 27, Issue -
Pages -
Publication year: 2017